Grammatical Evolution-Driven Algorithm for Efficient and Automatic Hyperparameter Optimisation of Neural Networks
Authors

Abstract
Neural networks have revolutionised the way we approach problem solving across multiple domains; however, their effective design and efficient use of computational resources remains a challenging task. One of the most important factors influencing this process is the model hyperparameters, which vary significantly across models and datasets. Recently, there has been an increased focus on automatically tuning these hyperparameters to reduce complexity and optimise resource utilisation. From traditional human-intuitive tuning methods to random search, grid search, Bayesian optimisation, and evolutionary algorithms, significant advancements have been made in this direction, promising improved performance while using fewer resources. In this article, we propose HyperGE, a two-stage model for hyperparameter tuning driven by grammatical evolution (GE), a bioinspired population-based machine learning algorithm. GE provides the advantage of allowing users to define their own grammar for generating solutions, making it ideal for defining search spaces across datasets and models. We test HyperGE to fine-tune the VGG-19 and ResNet-50 pre-trained networks on three benchmark datasets and demonstrate that the search space is reduced by a factor of ~90% in Stage 2 with a reduced number of trials. HyperGE could become an invaluable tool within the deep learning community, allowing practitioners greater freedom when exploring complex problem domains for hyperparameter fine-tuning.
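As a rough illustration of the grammar-driven idea described above (a sketch, not the paper's implementation), a hyperparameter search space can be written as a BNF-style grammar, with a genome of integer codons selecting among the alternative productions. The grammar symbols and hyperparameter values below are hypothetical:

```python
# Hypothetical BNF-style grammar: each non-terminal maps to a list of
# alternative expansions (each expansion is a sequence of symbols).
GRAMMAR = {
    "<config>": [["lr=", "<lr>", ";batch=", "<batch>", ";dropout=", "<dropout>"]],
    "<lr>": [["1e-2"], ["1e-3"], ["1e-4"], ["1e-5"]],
    "<batch>": [["16"], ["32"], ["64"], ["128"]],
    "<dropout>": [["0.1"], ["0.3"], ["0.5"]],
}

def map_genome(genome, start="<config>"):
    """GE-style genotype-to-phenotype mapping: each codon, taken modulo the
    number of alternatives, picks the production for the leftmost non-terminal."""
    out, stack, i = [], [start], 0
    while stack:
        sym = stack.pop(0)
        if sym in GRAMMAR:
            choices = GRAMMAR[sym]
            choice = choices[genome[i % len(genome)] % len(choices)]
            i += 1  # consume one codon per decision (wrapping if exhausted)
            stack = list(choice) + stack
        else:
            out.append(sym)  # terminal symbol: emit as-is
    return "".join(out)

# A genome of integer codons deterministically yields one configuration.
print(map_genome([7, 3, 5, 2]))  # → lr=1e-5;batch=32;dropout=0.5
```

Evolving such genomes with a population-based search then amounts to searching the hyperparameter space the grammar defines; restricting the grammar (as in a second stage) shrinks that space directly.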
Similar resources
An effective algorithm for hyperparameter optimization of neural networks
A major challenge in designing neural network (NN) systems is to determine the best structure and parameters for the network given the data for the machine learning problem at hand. Examples of parameters are the number of layers and nodes, the learning rates, and the dropout rates. Typically, these parameters are chosen based on heuristic rules and manually fine-tuned, which may be very time-c...
Towards Automatic Generation of Evolution Rules for Model-Driven Optimisation
Over recent years, optimisation and evolutionary search have seen substantial interest in the MDE research community. Many of these techniques require the specification of an optimisation problem to include a set of model transformations for deriving new solution candidates from existing ones. For some problems—for example, planning problems, where the domain only allows specific actions to be ...
Exploring Grammatical Evolution for Horse Gait Optimisation
Physics-based animal animations require data for realistic motion. This data is expensive to acquire through motion capture and inaccurate when estimated by an artist. Grammatical Evolution (GE) can be used to optimise pre-existing motion data or generate novel motions. Optimised motion data produces sustained locomotion in a physics-based model. To explore the use of GE for gait optimisation, ...
Automatic Programming with Grammatical Evolution
The aim of this work is to develop an Automatic Programming system drawing inspiration from biological genetic systems. Grammatical Evolution (GE), a system which can generate code in any language from variable length binary strings is the result of this investigation. Using a language definition in the form of a Backus Naur Form grammar permits the generation of programs in any language, of ar...
Efficient Transfer Learning Method for Automatic Hyperparameter Tuning
We propose a fast and effective algorithm for automatic hyperparameter tuning that can generalize across datasets. Our method is an instance of sequential model-based optimization (SMBO) that transfers information by constructing a common response surface for all datasets, similar to Bardenet et al. (2013). The time complexity of reconstructing the response surface at every SMBO iteration in ou...
Journal

Journal title: Algorithms
Year: 2023
ISSN: 1999-4893
DOI: https://doi.org/10.3390/a16070319